Randomized Block Proximal Methods for Distributed Stochastic Big-Data Optimization
Abstract
In this paper, we introduce a class of novel distributed algorithms for solving stochastic big-data convex optimization problems over directed graphs. In the addressed set-up, the dimension of the decision variable can be extremely high and the objective function can be nonsmooth. The general algorithm consists of two main steps: a consensus step and an update on a single block of the decision variable, which is then broadcast to neighbors. Three special instances of the proposed method, involving particular problem structures, are presented. In the general case, the convergence of a dynamic consensus scheme over random row-stochastic matrices is shown. Then, convergence to the optimal cost is proven in expected value. Exact convergence is achieved when using diminishing (local) stepsizes, while approximate convergence is attained when constant stepsizes are employed. The convergence rate is shown to be sublinear, and an explicit rate is provided in the case of constant stepsizes. Finally, the algorithm is tested on a classification problem, first on synthetic data and, then, on a real, high-dimensional text dataset.
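To make the two-step structure concrete, here is a minimal single-node Python sketch of one iteration: a row-stochastic consensus average over in-neighbors, followed by a proximal update (here, soft-thresholding for an l1 regularizer) on one randomly drawn block. All names, the choice of prox, and the uniform block sampling are illustrative assumptions, not the paper's exact scheme.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1, a common nonsmooth regularizer.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def local_iteration(neighbor_states, weights, stoch_grad, stepsize,
                        n_blocks, block_size):
        # 1) Consensus step: row-stochastic weighted average of the estimates
        #    received from in-neighbors (including the node's own estimate).
        y = sum(w * x for w, x in zip(weights, neighbor_states))
        # 2) Draw one block uniformly at random and apply a stochastic
        #    proximal gradient step to that block only.
        b = np.random.randint(n_blocks)
        blk = slice(b * block_size, (b + 1) * block_size)
        g = stoch_grad(y)
        x_next = y.copy()
        x_next[blk] = soft_threshold(y[blk] - stepsize * g[blk], stepsize)
        # 3) In the distributed protocol, only block b of x_next would be
        #    broadcast to out-neighbors.
        return x_next, b

Transmitting a single block per iteration is what keeps per-iteration communication manageable when the decision variable is extremely high-dimensional.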
Similar Resources
Randomized Block Subgradient Methods for Convex Nonsmooth and Stochastic Optimization
Block coordinate descent methods and stochastic subgradient methods have been extensively studied in optimization and machine learning. By combining randomized block sampling with stochastic subgradient methods based on dual averaging ([22, 36]), we present stochastic block dual averaging (SBDA)—a novel class of block subgradient methods for convex nonsmooth and stochastic optimization. SBDA re...
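As a rough illustration of such a combination, the sketch below shows what a block dual-averaging step could look like with a Euclidean prox-function, for which the dual-averaging minimization has a closed form. The stepsize rule, scaling, and unbiasedness correction are assumptions for illustration, not the SBDA paper's exact update.

    import numpy as np

    def block_dual_averaging_step(z, t, subgrad, n_blocks, block_size, gamma=1.0):
        # z: running weighted sum of sampled block subgradients; t >= 1.
        # Primal point from the dual average; with d(x) = ||x||^2 / 2 the
        # dual-averaging minimization admits this closed form.
        x = -z / (gamma * np.sqrt(t))
        # Sample one block and accumulate a stochastic subgradient on that
        # block only; the factor n_blocks keeps the accumulated direction
        # unbiased under uniform block sampling.
        b = np.random.randint(n_blocks)
        blk = slice(b * block_size, (b + 1) * block_size)
        g = subgrad(x)
        z = z.copy()
        z[blk] += n_blocks * g[blk]
        return z, x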
Intelligent Distributed Processing Methods for Big Data
Motivation: Today, "Big Data" is a new information-overload problem in many different areas. Such areas include health care (e.g., medical records, bioinformatics), e-sciences (e.g., physics, chemistry, and geology), and social sciences (e.g., politics). Thus, as we have various types of feasible data from a number of available sources, it is becoming increasingly difficult to efficient...
Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
In this paper, we present a new stochastic algorithm, namely the stochastic block mirror descent (SBMD) method for solving large-scale nonsmooth and stochastic optimization problems. The basic idea of this algorithm is to incorporate the block-coordinate decomposition and an incremental block averaging scheme into the classic (stochastic) mirror-descent method, in order to significantly reduce ...
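With a Euclidean prox-function, the block mirror-descent step reduces to a projected stochastic (sub)gradient step on one sampled block. The sketch below illustrates only this special case under that assumption; names and the uniform sampling are illustrative, not the SBMD method in full generality.

    import numpy as np

    def sbmd_step(x, stoch_grad, stepsize, n_blocks, block_size,
                  project=lambda v: v):
        # Draw one block uniformly at random.
        b = np.random.randint(n_blocks)
        blk = slice(b * block_size, (b + 1) * block_size)
        # Stochastic (sub)gradient at the current point, restricted to the block.
        g = stoch_grad(x)
        x = x.copy()
        # Euclidean mirror step on the sampled block, followed by projection
        # onto the (block) feasible set.
        x[blk] = project(x[blk] - stepsize * g[blk])
        return x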
Asynchronous Stochastic Proximal Methods for Nonconvex Nonsmooth Optimization
We study stochastic algorithms for solving non-convex optimization problems with a convex yet possibly non-smooth regularizer, which find wide use in many practical machine learning applications. However, compared to asynchronous parallel stochastic gradient descent (AsynSGD), an algorithm targeting smooth optimization, the understanding of the behavior of stochastic algorithms for the n...
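The core update analyzed in this setting can be sketched as a proximal stochastic gradient step in which the gradient is evaluated at a possibly stale iterate, modeling lock-free asynchronous reads. The l1 prox and all names below are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def async_prox_sgd_step(x_current, x_stale, stoch_grad, stepsize, lam):
        # Gradient of the smooth (possibly nonconvex) part at a delayed copy
        # of the iterate, as read by an asynchronous worker.
        g = stoch_grad(x_stale)
        v = x_current - stepsize * g
        # Proximal step for the convex nonsmooth regularizer lam * ||.||_1
        # (soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - stepsize * lam, 0.0)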
Proximal Stochastic Methods for Nonsmooth Nonconvex Finite-Sum Optimization
We analyze stochastic algorithms for optimizing nonconvex, nonsmooth finite-sum problems, where the nonsmooth part is convex. Surprisingly, unlike the smooth case, our knowledge of this fundamental problem is very limited. For example, it is not known whether the proximal stochastic gradient method with constant minibatch converges to a stationary point. To tackle this issue, we develop fast st...
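One family of such fast methods combines a proximal step with SVRG-style variance reduction; the sketch below shows a variance-reduced proximal gradient step of this kind, under the assumption that this is the intended construction (all names are illustrative).

    import numpy as np

    def prox_svrg_step(x, x_snapshot, full_grad_snapshot, grad_i, stepsize, prox):
        # grad_i must evaluate the SAME randomly sampled component f_i at both
        # points, so the correction term keeps the estimator's variance small.
        g = grad_i(x) - grad_i(x_snapshot) + full_grad_snapshot
        # Proximal step handles the convex nonsmooth part of the objective.
        return prox(x - stepsize * g, stepsize)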
Journal
Journal title: IEEE Transactions on Automatic Control
Year: 2021
ISSN: 0018-9286, 1558-2523, 2334-3303
DOI: https://doi.org/10.1109/tac.2020.3027647